Okay, let's do more.
Well, you can write down blobs and arrows and numbers until you're blue in the face.
We also have to say what this all means.
And then of course, what you do with it.
And as you've learned from looking at something as simple as logic, what you do with it should
somehow be guided by what it all means.
Okay.
So what is the meaning of this graph?
Well, there are dependencies.
Here's a couple of things this graph means.
Alarm depends on burglary and earthquake.
And also, if you want to know something like the probability that Mary calls given an alarm
and a burglary, then you know that there is a conditional independence, so you can drop the
burglary.
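As a quick sanity check, here is a minimal Python sketch of exactly this point: it enumerates the joint distribution of the alarm network (using the usual illustrative textbook numbers for this example, not values from this lecture) and confirms that conditioning on the burglary in addition to the alarm does not change the probability that Mary calls.

```python
from itertools import product

# CPTs for the burglary/earthquake alarm network; the numbers are the
# usual illustrative textbook values, not taken from this lecture.
P_B = {True: 0.001, False: 0.999}                   # P(Burglary)
P_E = {True: 0.002, False: 0.998}                   # P(Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.94,     # P(Alarm | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_M = {True: 0.70, False: 0.01}                     # P(MaryCalls | Alarm)

def joint(b, e, a, m):
    """P(B=b, E=e, A=a, M=m) from the network's factorization."""
    pa = P_A[(b, e)] if a else 1.0 - P_A[(b, e)]
    pm = P_M[a] if m else 1.0 - P_M[a]
    return P_B[b] * P_E[e] * pa * pm

def p_mary(given):
    """P(MaryCalls=true | given), with given a dict like {'A': True}."""
    num = den = 0.0
    for b, e, a, m in product([True, False], repeat=4):
        assign = {'B': b, 'E': e, 'A': a, 'M': m}
        if any(assign[k] != v for k, v in given.items()):
            continue
        p = joint(b, e, a, m)
        den += p
        if m:
            num += p
    return num / den

# MaryCalls is independent of Burglary given Alarm, so the burglary drops out:
with_burglary = p_mary({'A': True, 'B': True})
without_burglary = p_mary({'A': True})
```

Both queries give the same answer, which is just the CPT entry P(MaryCalls | Alarm) itself.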
So the essential thing here, again, is the arrows we don't have.
The fewer arrows we have, the fewer dependencies we have, the more conditional independences
we have, and the more we can do with this.
A fully connected graph, a fully connected Bayesian network, is no gain over the full joint
probability distribution.
Okay.
Anything that's less than a fully connected graph is actually better.
And this is a very unconnected graph, so we're very happy.
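One way to see the gain concretely is to count parameters: a boolean node with k parents needs 2^k numbers, one per parent assignment. A small sketch (the helper and variable names are hypothetical, but the alarm-network structure is the one from this lecture):

```python
# Hypothetical helper: count the independent probabilities a boolean-valued
# Bayesian network needs -- one number per assignment of each node's parents.
def bn_param_count(parents):
    return sum(2 ** len(ps) for ps in parents.values())

# The sparse alarm network: 1 + 1 + 4 + 2 + 2 = 10 numbers.
alarm_net = {'B': [], 'E': [], 'A': ['B', 'E'], 'J': ['A'], 'M': ['A']}
sparse = bn_param_count(alarm_net)

# A fully connected network over the same five variables (one ordering):
# 1 + 2 + 4 + 8 + 16 = 31 = 2**5 - 1, as many as the full joint itself.
full_net = {'B': [], 'E': ['B'], 'A': ['B', 'E'],
            'J': ['B', 'E', 'A'], 'M': ['B', 'E', 'A', 'J']}
dense = bn_param_count(full_net)
```

Ten numbers instead of thirty-one, and the gap grows exponentially with the number of variables.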
Okay.
Here is, and it's a bit of a mouthful, what the conditional independencies really
are.
We can say each node in a Bayesian network is conditionally independent of all of its
non-descendants.
What are the non-descendants of X?
Or what are the descendants of X?
Yes?
Yes, these.
That would be the children also.
The descendants also go further down.
So anything below the Ys would also be a descendant.
So non-descendants is everything outside kind of this cone that goes down there.
So in this case, the Zs and the Us.
But we have something else here.
We have conditionally independent of its non-descendants given its parents.
So given the Us, X is conditionally independent of the Zs here.
But not only the Zs, anything that's outside the gray zone as well.
Given its parents, X is also conditionally independent of all the ancestors of the Us, and
so on.
So if you think about having a huge graph, then that's quite a lot of conditional independence.
And that's what makes these things nice.
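The statement can be checked numerically on the alarm network by brute force (again with the usual illustrative textbook numbers): John calls has the parent Alarm, and Mary calls is one of its non-descendants, so conditioning on Mary calls in addition to the alarm should change nothing.

```python
from itertools import product

# Alarm network: structure plus the usual illustrative textbook CPTs.
parents = {'B': (), 'E': (), 'A': ('B', 'E'), 'J': ('A',), 'M': ('A',)}
cpt = {
    'B': {(): 0.001},
    'E': {(): 0.002},
    'A': {(True, True): 0.95, (True, False): 0.94,
          (False, True): 0.29, (False, False): 0.001},
    'J': {(True,): 0.90, (False,): 0.05},
    'M': {(True,): 0.70, (False,): 0.01},
}
order = ['B', 'E', 'A', 'J', 'M']

def joint(assign):
    """Probability of one full assignment via the chain-rule factorization."""
    p = 1.0
    for v in order:
        pv = cpt[v][tuple(assign[u] for u in parents[v])]
        p *= pv if assign[v] else 1.0 - pv
    return p

def prob(event, given):
    """P(event | given) by brute-force enumeration of the joint."""
    num = den = 0.0
    for vals in product([True, False], repeat=len(order)):
        a = dict(zip(order, vals))
        if any(a[k] != v for k, v in given.items()):
            continue
        p = joint(a)
        den += p
        if all(a[k] == v for k, v in event.items()):
            num += p
    return num / den

# J's only parent is A; M is a non-descendant of J, so given A it adds nothing:
lhs = prob({'J': True}, {'A': True, 'M': True})
rhs = prob({'J': True}, {'A': True})
```

The two queries agree exactly, which is the non-descendants-given-parents statement in action for one node.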
So this is kind of the picture you should have in your mind when you think of the independences.
And you can see the more arrows we get, the weaker the conditional independences become.
Yes?
Conditionally independent of what?
And of John calls.
And so on, yes.
Accessible via
Open access
Duration
00:15:22 min
Recording date
2021-02-01
Uploaded on
2021-03-29 12:26:12
Language
en-US
The semantics of a Bayesian network are explained. Also, recovering the full joint probability distribution from a Bayesian network is discussed.